Analysis of a self-scaling quasi-Newton method
Authors
Abstract
We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in addition, the search directions were well-scaled, but we show that this is not always the case. We find that the method has a major drawback: to achieve superlinear convergence it may be necessary to evaluate the function twice per iteration, even very near the solution. An example is constructed to show that the step-sizes required to achieve a superlinear rate converge to 2 and 0.5 alternately.
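For readers unfamiliar with the update being analyzed, the following is a minimal numerical sketch of a self-scaling BFGS iteration in the direct (Hessian-approximation) form, with the scaling factor gamma = (y's)/(s'Bs) applied before the standard BFGS update and a simple Armijo backtracking (inexact) line search. All function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Self-scaling BFGS update of the Hessian approximation B, with
    s = x_{k+1} - x_k and y = grad_{k+1} - grad_k.  B is multiplied by
    gamma = (y's)/(s'Bs) before the standard BFGS update is applied,
    a scaling intended to curb large eigenvalues of the approximation."""
    Bs = B @ s
    sBs = s @ Bs
    gamma = (y @ s) / sBs                     # Oren-Luenberger scaling factor
    return gamma * (B - np.outer(Bs, Bs) / sBs) + np.outer(y, y) / (y @ s)

def minimize(f, grad, x0, max_iter=100, tol=1e-8):
    """Self-scaling BFGS with an Armijo backtracking (inexact) line search."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(B, g)            # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5                          # backtrack until Armijo holds
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-12:                     # update only if curvature holds,
            B = self_scaling_bfgs_update(B, s, y)  # keeping B positive definite
        x, g = x_new, g_new
    return x

# Example: minimize a convex quadratic f(x) = 0.5 x'Ax - b'x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = minimize(lambda x: 0.5 * x @ A @ x - b @ x,
                  lambda x: A @ x - b, np.zeros(2))
```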
Similar resources
Wide interval for efficient self-scaling quasi-Newton algorithms
This paper uses certain conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from the so-called convex class, but increase their c...
The Global Convergence of Self-Scaling BFGS Algorithm with Nonmonotone Line Search for Unconstrained Nonconvex Optimization Problems
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, so as to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method has global and superlinear convergence when the objective function is...
A Self-Correcting Variable-Metric Algorithm for Stochastic Optimization
An algorithm for stochastic (convex or nonconvex) optimization is presented. The algorithm is variable-metric in the sense that, in each iteration, the step is computed through the product of a symmetric positive definite scaling matrix and a stochastic (mini-batch) gradient of the objective function, where the sequence of scaling matrices is updated dynamically by the algorithm (a generic sketch of such a step follows this list). A key feature ...
A combined class of self-scaling and modified quasi-Newton methods
Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain 'better' curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, has global and superlinear convergence for convex functions. Numerical experiments with this class, ...
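For contrast with the deterministic methods above, the variable-metric stochastic step described in "A Self-Correcting Variable-Metric Algorithm for Stochastic Optimization" has the general shape sketched below. The paper's self-correcting rule for updating the scaling matrices is not reproduced; the sketch holds the matrix fixed, and all names (variable_metric_sgd, grad_batch, B) are illustrative assumptions.

```python
import numpy as np

def variable_metric_sgd(grad_batch, x0, B, steps=200, alpha=0.05, seed=0):
    """Generic variable-metric stochastic iteration: x <- x - alpha * (B @ g),
    where B is a symmetric positive definite scaling matrix and g is a
    stochastic (mini-batch) gradient.  The cited paper updates B dynamically
    with a self-correcting rule; here B is held fixed for illustration."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad_batch(x, rng)        # mini-batch gradient estimate
        x = x - alpha * (B @ g)       # step scaled by the metric B
    return x
```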
Journal: Math. Program.
Volume: 61, Issue: -
Pages: -
Publication year: 1993